Determination of the proper embedding parameters for noisy time series
Abstract
We suggest an algorithm for determining the proper delay time and the minimum embedding dimension for Takens' delay-time embedding procedure. The method is based on the rate of change, with respect to the delay time, of the spatial distribution of points on the reconstructed attractor. It can be applied successfully to a time series so noisy that the correlation integral cannot distinguish it from structureless noise, and it also shows that the proper delay time depends on the embedding dimension.

PACS numbers: 05.45.+b
Electronic mail: [email protected]
Electronic mail: [email protected]

Since the discovery of the method for reconstructing state spaces from a time series [1,2], various schemes have been proposed for choosing the embedding parameters, namely the proper delay time and the minimum embedding dimension [3-10]. By Takens' theorem [2], the reconstruction is generically diffeomorphic to the original dynamics if d > 2 d_A, where d_A is the dimension of the compact invariant smooth manifold A of the continuous-time dynamical system and d is the embedding dimension. The theorem, however, determines neither the delay time nor the minimum embedding dimension. Nevertheless, because the quality of the reconstructed attractor strongly affects the success of noise-reduction procedures [11] and forecasting methods, as well as the accurate computation of invariant quantities such as Lyapunov exponents, it is essential to choose the proper embedding parameters before carrying out these tasks. In computational analysis of a noisy time series, because the proper delay time depends on the purpose of the reconstruction, the delay time is often considered much less critical to the success of the reconstruction than the embedding dimension.
In general, if the chosen delay time is too small, the reconstructed points approximately coincide within the range of the errors, so that the reconstructed attractor appears "stretched out and randomly crowded" along the identity line of the delay coordinates. If the delay time is too large, procedures such as the Grassberger-Procaccia algorithm [12] for the correlation dimension can be led to wrong results [13] by the nonlinearity of the embedding process rather than that of the original dynamics. On the other hand, working in any dimension larger than the minimum increases the computational burden of obtaining invariant quantities such as Lyapunov exponents, of prediction, and so on. It also aggravates the problem of contamination by round-off or measurement errors, since noise populates and dominates the additional dimensions of the embedding space. The most important reason for seeking the minimum embedding dimension is to gain intuition for understanding and modeling the underlying dynamics of a noisy time series. Moreover, in many noise-reduction schemes the first task is to have at least a rough idea of the minimum embedding dimension [11].

In this paper, we propose a method to determine simultaneously the minimum embedding dimension d_M and the proper delay time T_d^d for a noisy time series. The method can be applied successfully to a time series contaminated by measurement noise so large that the deterministic part cannot be disentangled from the random part by means of the correlation integral [14]. On the basis of this method, we also find that the proper window length W^d ≡ (d − 1) T_d^d remains approximately constant for d ≥ d_M.

Let a continuous signal w(t) be measured at each sampling interval T_s to yield a time series {v(k)}:

    {v(k)} ≡ {v(k) | v(k) = w(k T_s), k = 1, 2, ..., N_T},    (1)

where N_T is the total number of points in the time series.
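As an illustration (not part of the paper), the correlation integral C(r) used by the Grassberger-Procaccia algorithm mentioned above is simply the fraction of pairs of reconstructed points closer than a scale r. A minimal Python sketch, with the function name `correlation_integral` being our own choice:

```python
import numpy as np

def correlation_integral(points, r):
    """Fraction of pairs of reconstructed points whose mutual
    distance is below the scale r (Grassberger-Procaccia C(r))."""
    n = len(points)
    # all pairwise Euclidean distances between reconstructed points
    dists = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    # keep each distinct pair once (upper triangle, excluding diagonal)
    iu = np.triu_indices(n, k=1)
    return np.mean(dists[iu] < r)
```

The correlation dimension is then estimated from the slope of log C(r) versus log r over the scaling region; for a very noisy series that scaling region is destroyed, which is the failure mode the text refers to.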
We then form the delay vectors

    y(p | d, T_d) = (v(1 + (p−1)j), v(1 + (p−1)j + T_d), ..., v(1 + (p−1)j + (d−1)T_d)),    p = 1, 2, ..., N,    (2)

where d and T_d denote the embedding dimension and the delay time in units of the sampling time T_s, respectively, and j is the interval, also in units of T_s, between the starting points of successive delay vectors.
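The delay-vector construction of Eq. (2) can be sketched in Python as follows (an illustrative implementation; the function name `delay_vectors` and the 0-based indexing are our own, with Eq. (2)'s 1-based indices shifted accordingly):

```python
import numpy as np

def delay_vectors(v, d, Td, j=1):
    """Build the N delay vectors y(p | d, Td) of Eq. (2) from the
    scalar series v: successive vectors start j samples apart and
    their components are spaced Td samples (both in units of Ts)."""
    v = np.asarray(v)
    # largest N such that the last component index stays in range
    N = (len(v) - (d - 1) * Td - 1) // j + 1
    return np.array([[v[p * j + k * Td] for k in range(d)]
                     for p in range(N)])
```

For example, a series of N_T = 10 samples with d = 3 and T_d = 2 yields N = 6 vectors, the first being (v(1), v(3), v(5)) in the 1-based notation of Eq. (2).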
متن کاملذخیره در منابع من
با ذخیره ی این منبع در منابع من، دسترسی به آن را برای استفاده های بعدی آسان تر کنید
عنوان ژورنال:
دوره شماره
صفحات -
Publication date: 1999